
    Implementing the EffTox dose-finding design in the Matchpoint trial

    Background: The Matchpoint trial aims to identify the optimal dose of ponatinib to give with conventional chemotherapy consisting of fludarabine, cytarabine and idarubicin to chronic myeloid leukaemia patients in blastic transformation phase. The dose should be both tolerable and efficacious. This paper describes our experience implementing EffTox in the Matchpoint trial.

    Methods: EffTox is a Bayesian adaptive dose-finding trial design that jointly scrutinises binary efficacy and toxicity outcomes. We describe a nomenclature for succinctly describing outcomes in phase I/II dose-finding trials. We use dose-transition pathways, where doses are calculated for each feasible set of outcomes in future cohorts. We introduce the phenomenon of dose ambivalence, where EffTox can recommend different doses after observing the same outcomes. We also describe our experiences with outcome ambiguity, where the categorical evaluation of some primary outcomes is temporarily delayed.

    Results: We arrived at an EffTox parameterisation that is simulated to perform well over a range of scenarios. In scenarios where dose ambivalence manifested, we were guided by the dose-transition pathways. This technique facilitates planning, and also helped us overcome short-term outcome ambiguity.

    Conclusions: EffTox is an efficient and powerful design, but not without its challenges. Joint phase I/II clinical trial designs will likely become increasingly important in coming years as we further investigate non-cytotoxic treatments and streamline the drug approval process. We hope this account of the problems we faced and the solutions we used will help others implement this dose-finding clinical trial design.

    Trial registration: Matchpoint was added to the European Clinical Trials Database (2012-005629-65) on 2013-12-30.
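
    The dose-transition pathway idea lends itself to a simple enumeration: list every feasible outcome set for the next cohort and record the dose that would be recommended for each. The sketch below illustrates only that bookkeeping; the outcome labels (E, B, T, N) and the recommend_next_dose() rule are hypothetical placeholders standing in for the fitted EffTox model, not the Matchpoint parameterisation.

        # Illustrative sketch of enumerating dose-transition pathways for one
        # future cohort. The decision rule is a hypothetical placeholder, not
        # the EffTox model used in the Matchpoint trial.
        from itertools import combinations_with_replacement

        # Per-patient outcomes in a phase I/II trial: (efficacy, toxicity) as binaries.
        OUTCOMES = {
            "E": (1, 0),  # efficacy without toxicity
            "B": (1, 1),  # both efficacy and toxicity
            "T": (0, 1),  # toxicity without efficacy
            "N": (0, 0),  # neither
        }

        def recommend_next_dose(n_eff, n_tox, current_dose, n_doses=5):
            """Placeholder decision rule standing in for the fitted model."""
            if n_tox >= 2:
                return max(1, current_dose - 1)        # de-escalate on excess toxicity
            if n_eff == 0:
                return min(n_doses, current_dose + 1)  # escalate if no activity seen
            return current_dose                        # otherwise stay at the same dose

        def dose_transition_pathways(current_dose, cohort_size=3):
            """Map every feasible cohort outcome to the dose recommended next."""
            pathways = {}
            for combo in combinations_with_replacement(OUTCOMES, cohort_size):
                n_eff = sum(OUTCOMES[c][0] for c in combo)
                n_tox = sum(OUTCOMES[c][1] for c in combo)
                pathways["".join(combo)] = recommend_next_dose(n_eff, n_tox, current_dose)
            return pathways

        if __name__ == "__main__":
            for outcome, dose in dose_transition_pathways(current_dose=3).items():
                print(f"cohort outcome {outcome} -> next dose {dose}")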

    Screened selection design for randomised phase II oncology trials: an example in chronic lymphocytic leukaemia

    BACKGROUND: As there are limited patients for chronic lymphocytic leukaemia trials, it is important that statistical methodologies in Phase II efficiently select regimens for subsequent evaluation in larger-scale Phase III trials.

    METHODS: We propose the screened selection design (SSD), a practical multi-stage, randomised Phase II design for two experimental arms. Activity is first evaluated by applying Simon’s two-stage design (1989) to each arm. If both arms are active, the play-the-winner selection strategy proposed by Simon, Wittes and Ellenberg (SWE) (1985) is applied to select the superior arm. A variant of the design, the Modified SSD, recommends the arm with the higher response rate only if its activity rate is greater by a clinically relevant margin. The operating characteristics are explored via a simulation study and compared to a Bayesian selection approach.

    RESULTS: Simulations showed that the proposed SSD can retain the sample size required by SWE and achieve a similar probability, of at least 90%, of selecting the correct superior arm, with the additional benefit of reducing the probability of selecting ineffective arms. This approach is comparable to a Bayesian selection strategy. The Modified SSD performs substantially better than the other designs in selecting neither arm when the underlying rates for both arms are desirable but equivalent, allowing other factors to be considered in the decision-making process. Although its probability of correctly selecting a superior arm may be reduced, it still performs reasonably well, and it also reduces the probability of selecting an inferior arm.

    CONCLUSIONS: SSD provides an easy-to-implement randomised Phase II design that selects the most promising treatment showing sufficient evidence of activity, with R code available to evaluate its operating characteristics.
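
    The screen-then-select logic can be explored with a small Monte Carlo sketch of the kind used to study operating characteristics. The Simon two-stage parameters (3/13 and 12/43) and the selection margin below are illustrative assumptions rather than the values used in the paper, and the play-the-winner step is reduced to picking the arm with the higher observed response rate.

        # Monte Carlo sketch of the screened selection idea: screen each arm with
        # Simon's two-stage design, then select the arm with the higher response
        # rate, optionally only if it wins by a clinically relevant margin (as in
        # the Modified SSD). Design parameters here are illustrative assumptions.
        import random

        def simon_two_stage(p, r1=3, n1=13, r=12, n=43):
            """Return (active?, responses, patients) for one simulated arm."""
            stage1 = sum(random.random() < p for _ in range(n1))
            if stage1 <= r1:
                return False, stage1, n1          # stop early for futility
            total = stage1 + sum(random.random() < p for _ in range(n - n1))
            return total > r, total, n            # declare activity if > r responses

        def screened_selection(p_a, p_b, margin=0.0):
            """One simulated trial: screen both arms, then select, or pick neither."""
            act_a, resp_a, n_a = simon_two_stage(p_a)
            act_b, resp_b, n_b = simon_two_stage(p_b)
            if not (act_a or act_b):
                return None
            if act_a and not act_b:
                return "A"
            if act_b and not act_a:
                return "B"
            diff = resp_a / n_a - resp_b / n_b
            if diff > margin:
                return "A"
            if diff < -margin:
                return "B"
            return None                           # too close to call (Modified SSD)

        if __name__ == "__main__":
            random.seed(1)
            picks = [screened_selection(0.40, 0.25, margin=0.05) for _ in range(10_000)]
            for label in ("A", "B", None):
                print(f"P(select {label}): {picks.count(label) / len(picks):.3f}")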

    Detecting discontinuities using nonparametric smoothing techniques in correlated data

    There is increasing interest in the detection and estimation of discontinuities in regression problems with one and two covariates, due to the wide variety of applications. Moreover, in many real-life applications we are likely to encounter a certain degree of dependence in observations that are collected over time or space. Detecting changes in dependent data in the presence of a smoothly varying trend is a much more complicated problem that has not previously been adequately studied. Hence, the aim of this thesis is to respond to the immense need for a nonparametric discontinuity test which is capable of incorporating robust estimation of the underlying dependence structure (if unknown) into the test procedure in one and two dimensions.

    By means of a difference-based method, using a local linear kernel smoothing technique, a global test of the hypothesis that an abrupt change is present in the smoothly varying mean level of a sequence of correlated data is developed in the one-dimensional setting. Accurate distributional calculations for the test statistic can be performed using standard results on quadratic forms. Extensive simulations are carried out to examine the performance of the test in the cases of both known and unknown correlation. For the latter, the effectiveness of the different algorithms devised to incorporate the estimation of correlation, for both equally and unequally spaced designs, is investigated. Various factors that affect the size and power of the test are also explored. In addition, a small simulation study is performed to compare the proposed test with an isotonic regression test proposed by Wu et al. (2001). The utility of the techniques is demonstrated by applying the proposed discontinuity test to three sets of real-life data, namely the Argentina rainfall data, the global warming data and the River Clyde data. The results are compared to those obtained using the isotonic regression test of Wu et al. (2001) and the Bayesian test of Thomas (2001).

    Finally, the test is also extended to detect discontinuities in spatially correlated data. The same differencing principle as in the one-dimensional case is utilised here. However, the discontinuity in this context does not occur only at a point but over a smooth curve, so the test has to take into account the additional element of direction. A two-stage algorithm which makes use of a partitioning process to remove observations near the discontinuity curve is proposed. A motivating application for the approach is the analysis of radiometric data on cesium fallout in a particular area in Finland after the Chernobyl nuclear reactor accident.

    The procedures outlined for both the one- and two-dimensional settings are particularly useful and relatively easy to implement. Although the main focus of the work is not to identify the exact locations of the discontinuities, useful graphical tools have been employed to infer their likely locations. The dissertation closes with a summary and discussion of the results presented, and proposes potential future work in this area.
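
    As a rough illustration of the difference-based principle in one dimension, the sketch below compares one-sided local linear fits to the left and right of each interior point and reports the largest gap. The correlation adjustment and the quadratic-form distribution theory that underpin the actual test are omitted, and the Gaussian kernel, bandwidth and simulated data are assumptions for illustration only.

        # Minimal sketch of the difference-based idea: at each interior point,
        # fit local linear smooths using only the data to the left and only the
        # data to the right, and flag the largest left-right gap. The reference
        # distribution (quadratic forms) and correlation handling are omitted.
        import numpy as np

        def local_linear(x, y, x0, h):
            """Local linear estimate of the mean at x0 with a Gaussian kernel."""
            w = np.exp(-0.5 * ((x - x0) / h) ** 2)
            X = np.column_stack([np.ones_like(x), x - x0])
            beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
            return beta[0]

        def max_jump(x, y, h):
            """Largest absolute left-vs-right difference over interior points."""
            best, where = 0.0, None
            for x0 in x[10:-10]:
                left, right = x < x0, x >= x0
                gap = abs(local_linear(x[left], y[left], x0, h)
                          - local_linear(x[right], y[right], x0, h))
                if gap > best:
                    best, where = gap, x0
            return best, where

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            x = np.linspace(0.0, 1.0, 200)
            # smooth trend plus an abrupt jump of size 1 at x = 0.6, with noise
            y = np.sin(2 * np.pi * x) + (x > 0.6) * 1.0 + rng.normal(0, 0.3, x.size)
            stat, loc = max_jump(x, y, h=0.05)
            print(f"max left-right gap {stat:.2f} near x = {loc:.2f}")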

    A comparison of short-term and long-term air pollution exposure associations with mortality in two cohorts in Scotland

    Air pollution–mortality risk estimates are generally larger at longer-term, compared with short-term, exposure time scales. We compared associations between short-term exposure to black smoke (BS) and mortality with long-term exposure–mortality associations in cohort participants and with short-term exposure–mortality associations in the general population from which the cohorts were selected. We assessed short-to-medium–term exposure–mortality associations in the Renfrew–Paisley and Collaborative cohorts (using nested case–control data sets), and compared them with long-term exposure–mortality associations (using a multilevel spatiotemporal exposure model and survival analyses) and short-to-medium–term exposure–mortality associations in the general population (using time-series analyses). For the Renfrew–Paisley cohort (15,331 participants), BS exposure–mortality associations were observed in nested case–control analyses that accounted for spatial variations in pollution exposure and individual-level risk factors. These cohort-based associations were consistently greater than associations estimated in time-series analyses using a single monitoring site to represent general population exposure {e.g., 1.8% [95% confidence interval (CI): 0.1, 3.4%] vs. 0.2% (95% CI: 0.0, 0.4%) increases in mortality associated with 10-μg/m3 increases in 3-day lag BS, respectively}. Exposure–mortality associations were of larger magnitude for longer exposure periods [e.g., 3.4% (95% CI: –0.7, 7.7%) and 0.9% (95% CI: 0.3, 1.5%) increases in all-cause mortality associated with 10-μg/m3 increases in 31-day BS in case–control and time-series analyses, respectively; and 10% (95% CI: 4, 17%) increase in all-cause mortality associated with a 10-μg/m3 increase in geometric mean BS for 1970–1979, in survival analysis]. After adjusting for individual-level exposure and potential confounders, short-term exposure–mortality associations in cohort participants were of greater magnitude than in comparable general population time-series study analyses. However, short-term exposure–mortality associations were substantially lower than equivalent long-term associations, which is consistent with the possibility of larger, more persistent cumulative effects from long-term exposures.
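
    The percentage increases quoted above all come from the same back-transformation of a fitted log relative risk to a 10-μg/m3 increment. The helper below shows that conversion; the example coefficient and standard error are invented for illustration and are not estimates from either cohort or the time-series analyses.

        # Convert a fitted log-relative-risk per 1 ug/m3 of black smoke (from a
        # Poisson time-series or Cox model) into the percent increase in
        # mortality per 10 ug/m3, with a 95% confidence interval. The example
        # coefficient and standard error are invented for illustration.
        import math

        def pct_increase_per_increment(beta, se, increment=10.0, z=1.96):
            """Percent change in risk per `increment` units, with a 95% CI."""
            point = (math.exp(beta * increment) - 1) * 100
            lower = (math.exp((beta - z * se) * increment) - 1) * 100
            upper = (math.exp((beta + z * se) * increment) - 1) * 100
            return point, lower, upper

        if __name__ == "__main__":
            # hypothetical fitted coefficient of 0.0018 per ug/m3 (se 0.0008)
            point, lo, hi = pct_increase_per_increment(0.0018, 0.0008)
            print(f"{point:.1f}% (95% CI: {lo:.1f}, {hi:.1f}) per 10 ug/m3")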

    The effect of network externalities on the perception of a new service offering: mobile banking

    This study extends previous research by determining whether potential adopters' perception of innovation characteristics mediates the effects of network externalities on their intention to use mobile banking. The study also explores the possible moderating effects of technology anxiety between network externalities, innovation characteristics and intention to use mobile banking.